Patent abstract:
A mobile terminal and a control method thereof are disclosed, whereby photos/videos can be conveniently taken/made using a plurality of different preview images taken with at least two cameras. The present invention includes a first camera, a second camera, a touch screen, and a controller (180) controlling the simultaneous display on the touch screen of at least a first preview image taken with the first camera and a second preview image taken with the second camera in a camera photography mode, the controller (180) controlling the capture/production of an image/video corresponding to a prescribed one of the simultaneously displayed preview images.
Publication number: FR3024786A1
Application number: FR1554373
Filing date: 2015-05-15
Publication date: 2016-02-12
Inventors: Kyungmin Cho; Jiwon Yun; Minah Song; Taeho Kim; Seoyoung Jeong
Applicant: LG Electronics Inc
IPC primary class:
Patent description:

[0001] The present invention relates to a mobile terminal, and more particularly to a mobile terminal and a control method thereof. Although the present invention is suitable for a wide range of applications, it is particularly suitable for facilitating the taking/making of photos/videos using a plurality of different preview images taken with at least two cameras. Generally, terminals can be classified as mobile/portable terminals and fixed terminals. Mobile terminals may further be classified into hand-held terminals and vehicle-mounted terminals according to whether they can be carried directly by the user.
[0002] As the functions of the terminal diversify, the terminal tends to be implemented as a media player provided with composite functions such as taking pictures or videos, playing music or video files, gaming, broadcast reception and the like, for example. To support and increase the functions of the terminal, improvement of the structural parts and/or software parts of the terminal may be considered. Recently, a smart-phone-type mobile terminal tends to be equipped with a high-performance camera. Since the mobile terminal can conveniently share an image or video taken with its own camera with an external terminal or the like, the frequency of use of the mobile terminal is constantly increasing. Typically, a smart-phone-type mobile terminal includes cameras disposed at the front and rear surfaces of the mobile terminal, respectively. Nevertheless, in a camera photography mode of a mobile terminal of the related art, only a preview image taken with one camera is displayed. In order to view a preview taken with the other camera, a user must inconveniently enter a camera switching mode. Accordingly, the present invention is directed to a mobile terminal and a control method thereof that substantially obviate one or more problems due to the limitations and disadvantages of the related art.
[0003] An object of the present invention is to provide a mobile terminal and a control method thereof, whereby images or videos can be photographed more conveniently and easily using a plurality of cameras. In particular, an object of the present invention is to provide a mobile terminal and a control method thereof, whereby images or videos can be taken more conveniently and easily using a plurality of different preview images taken with at least two cameras. Another object of the present invention is to provide a mobile terminal and a control method thereof, whereby images or videos of various types can be photographed by editing a plurality of preview images. Technical tasks obtainable by the present invention are not limited to the aforementioned technical tasks. And, other technical tasks not mentioned can be clearly understood from the following description by those skilled in the art to which the present invention relates. Additional advantages, objects and features of the invention will be set forth in part in the description which follows and will in part become apparent to those skilled in the art upon examination of the following, or may be learned by practice of the invention. The objectives and other advantages of the invention may be realized and attained by the structure particularly pointed out in the written description and its claims as well as the accompanying drawings. To achieve these objectives and other advantages and in accordance with the purpose of the invention, as embodied and broadly described herein, a mobile terminal according to an embodiment of the present invention may include a first camera, a second camera, a touch screen and a controller controlling the simultaneous display on the touch screen of at least a first preview image taken with the first camera and a second preview image taken with the second camera in a camera photography mode, the controller controlling the capture/production of an image/video corresponding to a prescribed one of the simultaneously displayed preview images. In another aspect of the present invention, as embodied and broadly described herein, a method of controlling a mobile terminal according to another embodiment of the present invention may include the steps of entering a camera photography mode, simultaneously displaying a first preview image taken with the first camera and a second preview image taken with the second camera on at least one touch screen, receiving a photograph command, and taking/making an image/video corresponding to a prescribed one of the simultaneously displayed preview images. Accordingly, the present invention provides the following effects and/or features. First, images or videos can be photographed more conveniently and easily using a plurality of cameras. In particular, according to the present invention, as a plurality of different preview images taken with at least two cameras are displayed simultaneously, images or videos can be photographed more conveniently and easily.
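The claimed method steps above can be sketched as a minimal control-flow model. This is a hypothetical illustration only: the `Camera` and `MultiPreviewController` names and their methods are illustrative stand-ins, not an API taken from the disclosure.

```python
# Hypothetical sketch of the claimed multi-preview capture flow.
# All class and method names are assumptions, not part of the patent.

class Camera:
    """Stand-in for one of the terminal's physical cameras."""
    def __init__(self, name):
        self.name = name

    def preview_frame(self):
        # Produce the live preview image for this camera.
        return f"{self.name}-preview"

    def capture(self):
        # Take/produce the actual image for this camera.
        return f"{self.name}-photo"


class MultiPreviewController:
    """Displays previews from both cameras at once and captures from
    whichever preview the user designates (the 'prescribed' preview)."""
    def __init__(self, first, second):
        self.cameras = {"front": first, "rear": second}

    def previews(self):
        # Step: simultaneous display of both preview images.
        return [cam.preview_frame() for cam in self.cameras.values()]

    def shoot(self, prescribed):
        # Step: a photograph command names a prescribed preview,
        # and the corresponding camera performs the capture.
        return self.cameras[prescribed].capture()
```

For example, `MultiPreviewController(Camera("front"), Camera("rear")).shoot("rear")` would capture from the rear camera while both previews remain on screen.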
[0004] Second, the present invention provides a visual effect or image content to a plurality of preview images as well as to an image taken with a camera, and is capable of photographing by editing a plurality of preview images into a single image or video. Effects obtainable by the present invention may not be limited to the above-mentioned effects. And, other effects not mentioned can be clearly understood from the following description by those skilled in the art to which the present invention pertains. It should be understood that the foregoing general description and the following detailed description of the present invention are both given by way of example and explanation and are intended to further explain the invention as claimed. The accompanying drawings, which are included to give a better understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and, together with the description, serve to explain the principle of the invention. In the drawings: Fig. 1 is a block diagram for describing a mobile terminal related to the present invention; Figs. 2A and 2B are perspective-view diagrams, from different directions, of an example of a mobile terminal related to the present invention; Fig. 3 is a flowchart of an example of an image/video photography process displaying a plurality of preview images taken with at least two cameras in a mobile terminal in accordance with an embodiment of the present invention; Fig. 4 is a diagram of an example of a display configuration of a plurality of preview images in a mobile terminal according to an embodiment of the present invention; Figs. 5A to 5D are diagrams of an example of a video-making process using a plurality of preview images in a mobile terminal according to an embodiment of the present invention; Figs. 6A to 6D are diagrams of an example of a method of switching a full screen between preview images and changing a preview magnification during a multi-preview mode in a mobile terminal according to an embodiment of the present invention; Figs. 7A to 7C are diagrams of an example of a display configuration of a thumbnail of an image taken in a multi-preview mode in a mobile terminal according to an embodiment of the present invention; Figs. 8A to 8D are diagrams of an example of a method of changing a visual effect given to a preview image in a mobile terminal according to an embodiment of the present invention; Figs. 9A and 9B are diagrams of an example of a simultaneous display configuration of a plurality of preview images to which different configuration values are applied in a mobile terminal according to an embodiment of the present invention; Figs. 10A to 10D are diagrams of an example of a free layout method of a preview image in a mobile terminal according to an embodiment of the present invention; Figs. 11A and 11B are diagrams of another example of a free layout method of a preview image in a mobile terminal according to an embodiment of the present invention; Figs. 12A to 12C are diagrams of an example of a method of further arranging an image content together with a preview image in a mobile terminal according to an embodiment of the present invention; Figs. 13A to 13C are diagrams of an example of a process of creating a third preview image using a first preview image and a second preview image in a mobile terminal according to an embodiment of the present invention; and Figs. 14A and 14B are diagrams of an example of a process of activating a multi-preview mode in consideration of battery saving in a mobile terminal according to an embodiment of the present invention. A detailed description will now be given according to exemplary embodiments disclosed herein with reference to the accompanying drawings.
For brevity of the description with reference to the drawings, the same or equivalent components may be given the same reference numbers, and their description will not be repeated. In general, a suffix such as "module" and "unit" may be used to designate elements or components. The use of such a suffix is merely intended to facilitate the description, and the suffix itself is not meant to give any special meaning or function. In the present disclosure, what is well known to those skilled in the relevant art has generally been omitted for the sake of brevity. The accompanying drawings are used to assist in easily understanding various technical features, and it should be understood that the embodiments presented herein are not limited by the accompanying drawings. As such, the present disclosure should be construed as extending to any modifications, equivalents and substitutions in addition to those specifically set forth in the accompanying drawings. The mobile terminals presented here can be implemented using a variety of different types of terminals. Examples of such terminals include cell phones, smart phones, user equipment, laptop computers, digital broadcasting terminals, personal digital assistants (PDAs), portable multimedia players (PMPs), navigation devices, portable computers (PCs), slate PCs, tablet PCs, ultrabooks, wearable devices (e.g., smart watches, smart glasses, head-mounted displays (HMDs)), and the like. By way of non-limiting example only, a further description will be given with reference to particular types of mobile terminals.
[0005] However, these teachings also apply to other types of terminals, such as the types noted above. In addition, these teachings can also be applied to fixed terminals such as digital TVs, desktop computers, and the like. Reference is now made to Figs. 1 to 2B, where Fig. 1 is a block diagram of a mobile terminal in accordance with the present disclosure, and Figs. 2A and 2B are conceptual views of an example of the mobile terminal, seen from different directions. According to the present invention, various functions are provided by linked operations between a mobile terminal and a wearable device. Accordingly, a configuration of a watch-type device is described as an example of a wearable device to which the present invention is applicable. The mobile terminal 100 is shown having components such as a wireless communication unit 110, an input unit 120, a detection unit 140, an output unit 150, an interface unit 160, a memory 170, a controller 180 and a power source unit 190. It is to be understood that implementing all of the illustrated components is not a requirement, and that greater or fewer components may alternatively be implemented. Referring now to Fig. 1, the mobile terminal 100 is shown having a wireless communication unit 110 configured with several jointly implemented components. For example, the wireless communication unit 110 typically includes one or more components that allow wireless communication between the mobile terminal 100 and a wireless communication system or network within which the mobile terminal is located.
[0006] The wireless communication unit 110 typically includes one or more modules that enable communications such as wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal, and communications between the mobile terminal 100 and an external server. In addition, the wireless communication unit 110 typically includes one or more modules that connect the mobile terminal 100 to one or more networks. To facilitate such communications, the wireless communication unit 110 includes one or more of a broadcast receiving module 111, a mobile communication module 112, a wireless Internet module 113, a short-range communication module 114 and a location information module 115. The input unit 120 includes a camera 121 for obtaining images or video, a microphone 122, which is a type of audio input device for inputting an audio signal, and a user input unit 123 (for example, a touch key, a push button, a mechanical key, a function key and the like) allowing a user to enter information. Data (e.g. audio, video, image and the like) are obtained by the input unit 120 and may be analyzed and processed by the controller 180 according to device parameters, user commands and combinations thereof. The detection unit 140 is typically implemented using one or more sensors configured to detect internal information of the mobile terminal, the surrounding environment of the mobile terminal, user information and the like. For example, in Fig. 1, the detection unit 140 is shown having a proximity sensor 141 and an illumination sensor 142.
If desired, the detection unit 140 may optionally or additionally include other types of sensors or devices, such as a touch sensor, an acceleration sensor, a magnetic sensor, a G-sensor, a gyroscope sensor, a motion sensor, an RGB sensor, an infrared (IR) sensor, a finger-scan sensor, an ultrasonic sensor, an optical sensor (for example a camera 121), a microphone 122, a battery gauge, an environmental sensor (for example a barometer, a hygrometer, a thermometer, a radiation detection sensor, a thermal sensor and a gas sensor, among others), and a chemical sensor (e.g. an electronic nose, a health-care sensor, a biometric sensor, and the like), to name a few. The mobile terminal 100 may be configured to use information obtained from the detection unit 140, and in particular, information obtained from one or more sensors of the detection unit 140, and combinations thereof. The output unit 150 is typically configured to output various types of information, such as audio, video, touch output, and the like. The output unit 150 is shown having a display unit 151, an audio output unit 152, a haptic module 153 and an optical output module 154. The display unit 151 may have an interlayer structure or an integrated structure with a touch sensor in order to facilitate a touch screen. The touch screen may provide an output interface between the mobile terminal 100 and a user, as well as operate as the user input unit 123 which provides an input interface between the mobile terminal 100 and the user. The interface unit 160 serves as an interface with various types of external devices that can be coupled to the mobile terminal 100. The interface unit 160, for example, may include any of wired or wireless ports, external power source ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, and the like.
In some cases, the mobile terminal 100 may perform various control functions associated with a connected external device, in response to the connection of the external device to the interface unit 160. The memory 170 is typically implemented to store data in order to support various functions or features of the mobile terminal 100. For example, the memory 170 may be configured to store application programs executed in the mobile terminal 100, data or instructions for operations of the mobile terminal 100, and the like. Some of these application programs may be downloaded from an external server via wireless communication. Other application programs may be installed within the mobile terminal 100 at the time of manufacture or shipment, which is typically the case for the basic functions of the mobile terminal 100 (for example, receiving a call, placing a call, receiving a message, sending a message, and the like). It is common for application programs to be stored in the memory 170, installed in the mobile terminal 100, and executed by the controller 180 to perform an operation (or function) for the mobile terminal 100. The controller 180 typically operates to control the overall operation of the mobile terminal 100, in addition to the operations associated with the application programs. The controller 180 may provide or process information or functions appropriate for a user by processing signals, data, information and the like, which are input or output by the various components shown in Fig. 1, or by activating application programs stored in the memory 170. By way of example, the controller 180 controls all or part of the components illustrated in Figs. 1 to 2B according to the execution of an application program that has been stored in the memory 170. The power source unit 190 may be configured to receive external power or provide internal power in order to supply the appropriate power required to operate the elements and components included in the mobile terminal 100. The power source unit 190 may include a battery, and the battery may be configured to be embedded in the terminal body, or configured to be detachable from the terminal body. Referring again to Fig. 1, various components shown in this figure will now be described in greater detail. With respect to the wireless communication unit 110, the broadcast receiving module 111 is typically configured to receive a broadcast signal and/or broadcast-associated information from an external broadcast management entity via a broadcast channel. The broadcast channel may include a satellite channel, a terrestrial channel, or both. In some embodiments, two or more broadcast receiving modules 111 may be used to facilitate simultaneous reception of two or more broadcast channels, or to support switching among broadcast channels. The broadcast management entity may be implemented using a server or system that generates and transmits a broadcast signal and/or broadcast-associated information, or a server that receives a pre-generated broadcast signal and/or broadcast-associated information and sends such items to the mobile terminal. The broadcast signal may be implemented using any of a TV broadcast signal, a radio broadcast signal, a data broadcast signal, and combinations thereof, among others. The broadcast signal may in some cases also include a data broadcast signal combined with a TV or radio broadcast signal.
[0007] The broadcast signal may be encoded according to any one of a variety of technical standards or broadcasting methods (for example, International Organization for Standardization (ISO), International Electrotechnical Commission (IEC), Digital Video Broadcasting (DVB), Advanced Television Systems Committee (ATSC), and the like) for transmission and reception of digital broadcast signals. The broadcast receiving module 111 may receive the digital broadcast signals using a method appropriate to the transmission method used. Examples of broadcast-associated information may include information associated with a broadcast channel, a broadcast program, a broadcast event, a broadcast service provider, or the like. The broadcast-associated information may also be provided via a mobile communication network and, in this case, received by the mobile communication module 112. The broadcast-associated information may be implemented in a variety of formats. For example, the broadcast-associated information may include an electronic program guide (EPG) for digital multimedia broadcasting (DMB), an electronic service guide (ESG) for digital video broadcast-handheld (DVB-H), and the like. The broadcast signals and/or broadcast-associated information received via the broadcast receiving module 111 may be stored in a suitable device, such as the memory 170. The mobile communication module 112 may transmit and/or receive wireless signals to and from one or more network entities. Typical examples of a network entity include a base station, an external mobile terminal, a server, and the like.
Such network entities are part of a wireless communication network, which is built according to technical standards or communication methods for mobile communications (for example, Global System for Mobile communications (GSM), Code Division Multiple Access (CDMA), CDMA2000 (Code Division Multiple Access 2000), EV-DO (Enhanced Voice-Data Optimized or Enhanced Voice-Data Only), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like). Examples of wireless signals transmitted and/or received via the mobile communication module 112 include audio call signals, video (telephony) call signals, and data in various formats for supporting the communication of text and multimedia messages.
[0008] The wireless Internet module 113 is configured to facilitate wireless Internet access. This module may be coupled to the mobile terminal 100 internally or externally. The wireless Internet module 113 may transmit and/or receive wireless signals via communication networks according to wireless Internet technologies. Examples of such wireless Internet access include Wireless Local Area Network (WLAN), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), Worldwide Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), LTE-A (Long Term Evolution-Advanced), and the like. The wireless Internet module 113 may transmit/receive data according to one or more of these wireless Internet technologies, as well as other Internet technologies. In some embodiments, when wireless Internet access is implemented according to, for example, WiBro, HSDPA, HSUPA, GSM, CDMA, WCDMA, LTE, LTE-A and the like, as part of a mobile communication network, the wireless Internet module 113 achieves such wireless Internet access. As such, the wireless Internet module 113 may cooperate with, or function as, the mobile communication module 112. The short-range communication module 114 is configured to facilitate short-range communications. Suitable technologies for implementing such short-range communications include BLUETOOTH™, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra-Wideband (UWB), ZigBee, Near Field Communication (NFC), Wireless Fidelity (Wi-Fi), Wi-Fi Direct, Wireless USB (Wireless Universal Serial Bus), and the like.
The short-range communication module 114 generally supports wireless communications between the mobile terminal 100 and a wireless communication system, communications between the mobile terminal 100 and another mobile terminal 100, or communications between the mobile terminal 100 and a network where another mobile terminal 100 (or an external server) is located, via wireless local area networks. An example of such wireless local area networks is a wireless personal area network. In some embodiments, another mobile terminal (which may be configured similarly to the mobile terminal 100) may be a wearable device, for example a smart watch, smart glasses or a head-mounted display (HMD), which can exchange data with the mobile terminal 100 (or otherwise cooperate with the mobile terminal 100). The short-range communication module 114 may detect or recognize the wearable device, and allow communication between the wearable device and the mobile terminal 100. In addition, when the detected wearable device is a device that is authenticated to communicate with the mobile terminal 100, the controller 180, for example, may cause data processed in the mobile terminal 100 to be transmitted to the wearable device via the short-range communication module 114. Hence, a user of the wearable device may use the data processed in the mobile terminal 100 on the wearable device. For example, when a call is received in the mobile terminal 100, the user may answer the call using the wearable device. Similarly, when a message is received in the mobile terminal 100, the user may check the received message using the wearable device.
[0009] The location information module 115 is generally configured to detect, calculate, derive or otherwise identify a position of the mobile terminal. For example, the location information module 115 includes a global positioning system (GPS) module, a Wi-Fi module, or both. If desired, the location information module 115 may alternatively or additionally operate with any of the other modules of the wireless communication unit 110 to obtain data relating to the position of the mobile terminal. For example, when the mobile terminal uses a GPS module, a position of the mobile terminal may be acquired using a signal sent from a GPS satellite. As another example, when the mobile terminal uses the Wi-Fi module, a position of the mobile terminal may be acquired based on information relating to a wireless access point (AP) that transmits or receives a wireless signal to or from the Wi-Fi module.
[0010] The input unit 120 may be configured to allow various types of input to the mobile terminal 100. Examples of such inputs include audio, image, video, data and user input. Image and video input is often obtained using one or more cameras 121. Such cameras 121 may process still-image or video frames obtained by image sensors in a video or image capture mode. The processed image frames may be displayed on the display unit 151 or stored in the memory 170. In some cases, the cameras 121 may be arranged in a matrix configuration to allow a plurality of images having various angles or focal points to be input to the mobile terminal 100. As another example, the cameras 121 may be located in a stereoscopic arrangement for acquiring left and right images for implementing a stereoscopic image. The microphone 122 is generally implemented to allow audio input into the mobile terminal 100. The audio input may be processed in a variety of ways according to a function performed in the mobile terminal 100. If desired, the microphone 122 may include various noise-removing algorithms to remove unwanted noise generated during the reception of external audio. The user input unit 123 is a component that allows input by a user. Such user input may allow the controller 180 to control the operation of the mobile terminal 100. The user input unit 123 may include one or more of a mechanical input element (for example, a key, a button located on a front and/or rear surface or a side surface of the mobile terminal 100, a dome switch, a jog wheel, a jog switch and the like), or a touch-sensitive input, among others. For example, the touch-sensitive input may be a virtual key or a soft key, which is displayed on a touch screen by software processing, or a touch key located on the mobile terminal at a location other than on the touch screen.
In addition, the virtual key or the visual key can be displayed on the touch screen in various forms, for example graphics, text, icon, video or a combination thereof. The detection unit 140 is generally configured to detect one or more of internal information of the mobile terminal, information of the surrounding environment of the mobile terminal, user information or the like. The controller 180 generally cooperates with the detection unit 140 to control the operation of the mobile terminal 100, or to execute data processing, a function or an operation associated with an application program installed in the mobile terminal, based on the detection provided by the detection unit 140. The detection unit 140 may be implemented using any of a variety of sensors, some of which will now be described in more detail. The proximity sensor 141 may include a sensor for detecting the presence or absence of an object approaching a surface, or an object located near a surface, using an electromagnetic field, infrared rays, or the like, without mechanical contact. The proximity sensor 141 may be arranged at an inner region of the mobile terminal covered by the touch screen, or near the touch screen. The proximity sensor 141 may, for example, include any of a transmissive-type photoelectric sensor, a direct reflective-type photoelectric sensor, a mirror reflective-type photoelectric sensor, a high-frequency oscillation proximity sensor, a capacitance-type proximity sensor, a magnetic-type proximity sensor, an infrared proximity sensor and the like. When the touch screen is implemented as a capacitance type, the proximity sensor 141 can detect the proximity of a pointer to the touch screen by changes in an electromagnetic field, which responds to the approach of an object having conductivity. In this case, the touch screen (touch sensor) may also be categorized as a proximity sensor.
[0011] The term "proximity touch" will often be used here to refer to the scenario in which a pointer is positioned close to the touch screen without coming into contact with it. The term "contact touch" will often be used here to refer to the scenario in which a pointer makes physical contact with the touch screen. For the position corresponding to the proximity touch of the pointer relative to the touch screen, such a position corresponds to the position where the pointer is perpendicular to the touch screen. The proximity sensor 141 can detect a proximity touch, and proximity touch patterns (e.g. distance, direction, speed, time, position, movement state, and the like). In general, the controller 180 processes data corresponding to proximity touches and proximity touch patterns detected by the proximity sensor 141, and causes visual information to be output on the touch screen. In addition, the controller 180 may control the mobile terminal 100 to perform different operations or process different data depending on whether a touch at a point on the touch screen is a proximity touch or a contact touch. A touch sensor can detect a touch applied to the touch screen, such as the display unit 151, using any of a variety of touch methods. Examples of such touch methods include a resistive type, a capacitive type, an infrared type, and a magnetic field type, among others. As an example, the touch sensor may be configured to convert changes in pressure applied to a specific portion of the display unit 151, or changes in capacitance occurring at a specific portion of the display unit 151, into electrical input signals. The touch sensor may also be configured to detect not only a touched position and a touched area, but also touch pressure and/or touch capacitance. A touch object is generally used to apply a touch input to the touch sensor.
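The proximity-touch/contact-touch distinction described above can be illustrated with a small sketch. This is hypothetical only: the threshold value, function names and returned strings are assumptions for illustration, not values or operations specified in the disclosure.

```python
# Illustrative sketch: branching on touch type, as paragraph [0011]
# describes for the controller 180. The 30 mm hover range and the
# dispatched actions are assumed, not taken from the patent.

PROXIMITY_RANGE_MM = 30.0  # assumed maximum hover distance the sensor reports

def classify_touch(distance_mm):
    """Return the touch category for a pointer at the given distance
    above the screen (0 or less means physical contact)."""
    if distance_mm <= 0.0:
        return "contact touch"
    if distance_mm <= PROXIMITY_RANGE_MM:
        return "proximity touch"
    return "no touch"

def handle_touch(distance_mm, position):
    """Perform a different operation for each touch type, mirroring
    the idea that the controller processes proximity and contact
    touches differently."""
    kind = classify_touch(distance_mm)
    if kind == "contact touch":
        return f"select item at {position}"
    if kind == "proximity touch":
        return f"show preview near {position}"
    return "ignore"
```

A proximity touch might, for instance, trigger a lightweight preview while a contact touch commits the selection; the split keeps the two input modes independently configurable.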
[0012] Examples of typical touch objects include a finger, a pencil, a stylus, a pointer, or the like. When a touch input is detected by a touch sensor, corresponding signals may be transmitted to a touch controller. The touch controller can process the received signals and then transmit corresponding data to the controller 180. Accordingly, the controller 180 can detect which region of the display unit 151 has been touched. Here, the touch controller may be a component separate from the controller 180, the controller 180 itself, or a combination thereof.
[0013] In some embodiments, the controller 180 may execute the same or different commands depending on the type of touch object that touches the touch screen or a touch key provided in addition to the touch screen. Whether to execute the same command or a different command depending on the object that provides the touch input may be decided based on a current operating state of the mobile terminal 100 or an application program currently running, for example. The touch sensor and the proximity sensor may be implemented individually, or in combination, to detect various types of touch. Such touches include a short touch, a long touch, a multi-touch, a drag touch, a flick touch, a pinch-in touch, a pinch-out touch, a swipe touch, a hovering touch and the like. If desired, an ultrasonic sensor may be implemented to recognize position information relating to a touch object using ultrasonic waves. The controller 180 may for example calculate the position of a wave generation source based on information detected by an illumination sensor and a plurality of ultrasonic sensors. Since light is much faster than ultrasonic waves, the time required for the light to reach the optical sensor is much shorter than the time required for the ultrasonic wave to reach the ultrasonic sensor. The position of the wave generation source can be calculated using this fact. For example, the position of the wave generation source can be calculated using the time difference relative to the time the ultrasonic wave takes to reach the sensor, with the light serving as a reference signal. The camera 121 typically includes at least one of a camera sensor (CCD, CMOS, etc.), a photosensitive sensor (or image sensors) and a laser sensor. Implementing the camera 121 with a laser sensor can enable the detection of a touch of a physical object with respect to a 3D stereoscopic image. The photosensitive sensor may be laminated on, or overlapped by, the display device.
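As an illustration only (the patent gives no formulas), the light/ultrasound time-difference ranging described above might be sketched as follows. The two-sensor layout, the function names and the closed-form trilateration are assumptions made for this sketch, not part of the disclosure:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature (assumed)

def distance_to_sensor(t_light, t_ultrasound):
    """Light arrives almost instantly, so its arrival marks the emission
    time; the extra delay of the ultrasonic wave gives the range."""
    return SPEED_OF_SOUND * (t_ultrasound - t_light)

def locate_2d(sensors, t_light, t_ultra):
    """Closed-form 2D trilateration for two ultrasonic sensors placed
    on the x-axis at known positions (x1, 0) and (x2, 0)."""
    (x1, _), (x2, _) = sensors
    r1 = distance_to_sensor(t_light, t_ultra[0])
    r2 = distance_to_sensor(t_light, t_ultra[1])
    # Intersection of the two range circles (taking the y >= 0 solution)
    x = (r1 ** 2 - r2 ** 2 + x2 ** 2 - x1 ** 2) / (2 * (x2 - x1))
    y = max(r1 ** 2 - (x - x1) ** 2, 0.0) ** 0.5
    return x, y
```

With more than two ultrasonic sensors, the same range equations would be solved in a least-squares sense rather than in closed form.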
The photosensitive sensor may be configured to scan the movement of a physical object near the touch screen. In more detail, the photosensitive sensor may include photodiodes and transistors arranged in rows and columns to scan the content received at the photosensitive sensor using an electrical signal that changes according to the amount of light applied. That is, the photosensitive sensor can calculate the coordinates of the physical object according to a variation of light and thereby obtain position information of the physical object. The display unit 151 is generally configured to output information processed in the mobile terminal 100. For example, the display unit 151 may display execution screen information of an application program executing at the mobile terminal 100, or user interface (UI) and graphical user interface (GUI) information in response to the execution screen information. In some embodiments, the display unit 151 may be implemented as a stereoscopic display unit for displaying stereoscopic images. A typical stereoscopic display unit may employ a stereoscopic display scheme such as a stereoscopic scheme (a glasses scheme), an autostereoscopic scheme (a glasses-free scheme), a projection scheme (a holographic scheme) or the like. In general, a stereoscopic 3D image may include a left image (e.g., a left eye image) and a right image (e.g., a right eye image).
Depending on how the left and right images are combined into a stereoscopic 3D image, the 3D stereoscopic imaging method can be divided into a top-and-bottom method in which the left and right images are located at the top and bottom of a frame, a left-to-right (or side-by-side) method in which the left and right images are located to the left and right in a frame, a checkerboard method in which fragments of the left and right images are arranged as a mosaic, an interlaced method in which the left and right images are alternately located in columns or rows, and a time-sequential (or frame-by-frame) method in which the left and right images are displayed alternately on a time basis.
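Three of the frame-packing methods named above can be illustrated on images represented as 2D lists of pixel values. The helper names are not from the patent; this is a toy sketch of the packing layouts only:

```python
def top_and_bottom(left, right):
    """Left image stacked above the right image in one frame."""
    return left + right

def side_by_side(left, right):
    """Left and right images placed left/right in one frame."""
    return [l_row + r_row for l_row, r_row in zip(left, right)]

def interlaced_rows(left, right):
    """Left and right images alternated line by line."""
    frame = []
    for l_row, r_row in zip(left, right):
        frame.append(l_row)
        frame.append(r_row)
    return frame
```

A real implementation would typically also rescale each eye's image so that the packed frame keeps the display resolution.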
[0014] Likewise, for a 3D thumbnail image, a left thumbnail and a right thumbnail can be generated from a left image and a right image of an original image frame, respectively, and then combined to generate a single 3D thumbnail image. In general, the term "thumbnail" may be used to denote a reduced image or a reduced still image. A generated left thumbnail and right thumbnail can be displayed with a horizontal distance difference between them of a depth corresponding to the disparity between the left image and the right image on the screen, thus conferring a stereoscopic sense of space. A left image and a right image required to implement a 3D stereoscopic image can be displayed on the stereoscopic display unit using a stereoscopic processing unit. The stereoscopic processing unit may receive a 3D image and extract the left image and the right image, or may receive a 2D image and convert it into a left image and a right image. The audio output module 152 is generally configured to output audio data. Such audio data may be obtained from any one of a number of different sources, so that the audio data may be received from the wireless communication unit 110 or may have been stored in the memory 170. The audio data may be output during modes such as a signal reception mode, a call mode, a recording mode, a speech recognition mode, a broadcast reception mode and the like. The audio output module 152 may provide an audible output relating to a particular function (e.g., a call signal reception sound, a message reception sound, etc.) performed by the mobile terminal 100. The audio output module 152 may also be implemented as a receiver, a speaker, a buzzer or the like. A haptic module 153 may be configured to generate various tactile effects that a user feels, perceives or otherwise experiences. A typical example of a tactile effect generated by the haptic module 153 is vibration.
The intensity, pattern, and the like of the vibration generated by the haptic module 153 may be controlled by a user selection or by adjustment by the control unit. For example, the haptic module 153 can output different vibrations in a combined or sequential manner.
[0015] In addition to vibration, the haptic module 153 can generate various other tactile effects, including a stimulation effect such as a pin arrangement moving vertically to contact the skin, a spray force or a suction force of air through a jet orifice or a suction opening, a touch to the skin, a contact of an electrode, an electrostatic force, an effect reproducing the sensation of cold and warmth using an element that can absorb or generate heat, and the like. The haptic module 153 may also be implemented to allow the user to feel a tactile effect through a muscular sensation, such as via the user's fingers or arm, as well as by transferring the tactile effect through direct contact. Two or more haptic modules 153 may be provided according to the particular configuration of the mobile terminal 100. An optical output module 154 may output a signal to indicate event generation using light from a light source. Examples of events generated in the mobile terminal 100 may include message reception, call waiting reception, a missed call, an alarm, a calendar announcement, an e-mail reception, receipt of information through an application, and the like.
[0016] A signal output from the optical output module 154 may be implemented so that the mobile terminal emits monochromatic light or light of a plurality of colors. The signal output can be terminated when the mobile terminal detects that a user has verified the generated event, for example. The interface unit 160 serves as an interface for external devices to be connected to the mobile terminal 100. For example, the interface unit 160 can receive data transmitted from an external device, receive power to transfer to elements and components within the mobile terminal 100, or transmit internal data of the mobile terminal 100 to such an external device. The interface unit 160 may include wired or wireless headset ports, external power source ports, wired or wireless data ports, memory card ports, ports for connecting a device having an identification module, audio input/output (I/O) ports, video I/O ports, earphone ports, or the like. The identification module may be a chip that stores various information for authenticating the right to use the mobile terminal 100 and may include a user identity module (UIM), a subscriber identity module (SIM), a universal subscriber identity module (USIM) and the like. In addition, the device comprising the identification module (also referred to herein as the "identifier device") may take the form of a smart card. As a result, the identifier device can be connected to the terminal 100 via the interface unit 160. When the mobile terminal 100 is connected to an external cradle, the interface unit 160 can serve as a passageway to enable power from the cradle to be supplied to the mobile terminal 100, or may serve as a passageway for transferring various command signals entered by the user from the cradle to the mobile terminal. Various command signals, or the power input from the cradle, may function as signals enabling recognition that the mobile terminal is properly mounted on the cradle.
The memory 170 may store programs to support operations of the controller 180 and store input/output data (e.g., phone book, messages, still images, videos, etc.). The memory 170 can store data relating to various vibration and audio patterns that are output in response to touch inputs on the touch screen. The memory 170 may include one or more types of storage media, including a flash memory, a hard disk, a solid-state disk, a silicon disk, a multimedia card micro type, a card-type memory (e.g., SD or XD memory, etc.), a random access memory (RAM), a static random access memory (SRAM), a read-only memory (ROM), an electrically erasable programmable read-only memory (EEPROM), a programmable read-only memory (PROM), a magnetic memory, a magnetic disk, an optical disk and the like. The mobile terminal 100 may also be operated in connection with a network storage device that performs the storage function of the memory 170 over a network, such as the Internet. The controller 180 may typically control the general operations of the mobile terminal 100. For example, the controller 180 may set or release a lock state to restrict a user's entry of a control command with respect to applications when a status of the mobile terminal satisfies a pre-established condition. The controller 180 may also perform control and processing associated with voice calls, data communications, video calls, and the like, or perform pattern recognition processing to recognize a handwritten input or a drawing input performed on the touch screen as characters (or words) or images, respectively. In addition, the controller 180 may control one component or a combination of these components to implement the various embodiments disclosed herein.
[0017] The power source unit 190 receives an external power supply or provides an internal power supply and provides the appropriate power required to operate the respective elements and components included in the mobile terminal 100. The power source unit 190 can include a battery, which is typically rechargeable or is detachably coupled to the terminal body for charging. The power source unit 190 may include a connection port. The connection port may be configured as an example of the interface unit 160 to which an external charger to provide power to recharge the battery is electrically connected.
[0018] As another example, the power source unit 190 may be configured to recharge the battery wirelessly without use of the connection port. In this example, the power source unit 190 can receive power, transferred from an external wireless power transmitter, using either an inductive coupling method based on magnetic induction or a magnetic resonance coupling method based on electromagnetic resonance. Various embodiments described herein may be implemented in a computer-readable medium, a machine-readable medium or a similar medium using, for example, software, hardware, or a combination thereof.
[0019] Referring now to FIGS. 2A and 2B, the mobile terminal 100 is described with reference to a bar-type terminal body. However, the mobile terminal 100 may alternatively be implemented in any of a variety of different configurations. Examples of such configurations include watch type, clip type, glasses type, or folder type, flip type, slide type, swing type and swivel type, in which two or more bodies are combined with one another in a relatively movable manner, and combinations thereof. The present discussion will often relate to a particular type of mobile terminal (for example, a bar type, a watch type, a glasses type and the like).
[0020] Nevertheless, such teachings regarding a particular type of mobile terminal will generally apply to other types of mobile terminals as well. The mobile terminal 100 will generally include a housing (for example a frame, a housing, a shell and the like) forming the appearance of the terminal. In this embodiment, the housing is formed using a front housing 101 and a rear housing 102. Various electronic components are incorporated in a space formed between the front housing 101 and the rear housing 102. At least one middle housing may furthermore be positioned between the front housing 101 and the rear housing 102. The display unit 151 is shown located on the front side of the terminal body for outputting information. As illustrated, a window 151a of the display unit 151 may be mounted on the front housing 101 to form the front surface of the terminal body together with the front housing 101. In some embodiments, electronic components may also be mounted on the rear housing 102. Examples of such electronic components include a detachable battery 191, an identification module, a memory card and the like. The rear shell 103 is shown covering the electronic components, and this shell can be detachably coupled to the rear housing 102. As a result, when the rear shell 103 is detached from the rear housing 102, the electronic components mounted on the rear housing 102 are exposed to the outside. As illustrated, when the rear shell 103 is coupled to the rear housing 102, a side surface of the rear housing 102 is partially exposed. In some cases, during coupling, the rear housing 102 may be completely obscured by the rear shell 103. In some embodiments, the rear shell 103 may include an opening for exposing a camera 121b or an audio output unit 152b to the outside. The housings 101, 102, 103 may be formed by injection molding of a synthetic resin or may be formed of a metal, for example stainless steel (STS), aluminum (Al), titanium (Ti) or the like.
[0021] As an alternative to the example in which the plurality of housings form an internal space for housing components, the mobile terminal 100 may be configured such that a housing forms the internal space. In this example, a mobile terminal 100 having a monocoque is formed so that a synthetic resin or metal extends from a side surface to a back surface. If desired, the mobile terminal 100 may include a water sealing unit to prevent the introduction of water into the terminal body.
[0022] For example, the water sealing unit may include a water seal which is located between the window 151a and the front housing 101, between the front housing 101 and the rear housing 102, or between the rear housing 102 and the back shell 103, to hermetically seal an internal space when these housings are coupled. Figs. 2A and 2B show certain components as arranged on the mobile terminal. Nevertheless, it should be understood that alternative arrangements are possible and fall within the teachings of the present disclosure. Such components may be omitted or rearranged. For example, the first handling unit 123a may be located on another surface of the terminal body, and the second audio output module 152b may be located on the side surface of the terminal body. The display unit 151 outputs information processed in the mobile terminal 100. The display unit 151 may be implemented using one or more suitable display devices. Examples of such suitable display devices include a liquid crystal display (LCD), a thin-film transistor (TFT-LCD) liquid crystal display, an organic light-emitting diode (OLED), a flexible display, a display in 3 dimensions (3D), an electronic ink display, and their combinations. The display unit 151 may be implemented using two display devices, which may implement the same or different display technology. For example, a plurality of display units 151 may be arranged on one side, spaced from each other, or these devices may be integrated, or these devices may be arranged on different surfaces. The display unit 151 may also include a touch sensor which detects a touch input received at the display unit. When a touch is input to the display unit 151, the touch sensor may be configured to detect that touch and the controller 180 may for example generate a control command or other signal corresponding to the touch. 
The content that is entered by touch may be a text or numeric value, or a menu item that can be indicated or designated in various modes. The touch sensor may be configured as a film having a touch pattern, disposed between the window 151a and a display on a rear surface of the window 151a, or as a metal wire patterned directly on the rear surface of the window 151a. Alternatively, the touch sensor may be integrally formed with the display. For example, the touch sensor may be disposed on a substrate of the display or within the display. The display unit 151 may also form a touch screen together with the touch sensor. Here, the touch sensor can serve as a user input unit 123 (see Figure 1). As a result, the touch screen can replace at least a portion of the functions of the first handling unit 123a. The first audio output module 152a may be implemented as a speaker to output voice audio, alarm sounds, multimedia audio playback, and the like. The window 151a of the display unit 151 will typically include an aperture to allow the audio generated by the first audio output unit 152a to pass. One alternative is to allow audio to be released along an assembly gap between the structural bodies (for example a gap between the window 151a and the front housing 101). In this case, a hole formed independently for outputting audio sounds may not be seen or is otherwise hidden in appearance, further simplifying the appearance and manufacture of the mobile terminal 100. The optical output module 154 may be configured to output light to indicate the generation of an event. Examples of such events include message reception, call waiting reception, a missed call, an alarm, a calendar announcement, an e-mail reception, receipt of information through an application, and the like. When a user has verified a generated event, the controller 180 can control the optical output module 154 to stop the light output.
[0023] The first camera 121a can process image frames as well as still or moving images obtained by the image sensor in a capture mode or a video call mode. The processed image frames may then be displayed on the display unit 151 or stored in the memory 170.
[0024] The first and second handling units 123a and 123b are examples of the user input unit 123, which can be manipulated by a user to provide an input to the mobile terminal 100. The first and second handling units 123a and 123b may also be commonly referred to as a handling portion, and may employ any tactile method that allows the user to perform manipulation such as touch, push, scroll, or the like. The first and second handling units 123a and 123b may also employ any non-touch method that allows the user to perform manipulation such as proximity touch, pointing, or the like. Figure 2A illustrates the first handling unit 123a as a touch key, but possible variants include a mechanical key, a push button, a touch key, and combinations thereof. An input received at the first and second handling units 123a and 123b can be used in various ways. For example, the first handling unit 123a may be used by the user to provide menu entry, a start, cancel, search, or the like key, and the second handling unit 123b may be used by the user for providing an input for controlling a volume level outputted by the first or second audio output unit 152a or 152b to switch to a touch recognition mode of the display unit 151, or the like. As another example of the user input unit 123, a rear input unit may be located on the rear surface of the terminal body. The rear input unit may be manipulated by a user to provide an input to the mobile terminal 100. The input may be used in a variety of different ways. For example, the rear input unit may be used by the user to provide a start / stop input, start, end, scroll, volume level control output by the first or second audio output unit 152a or 152b , switching to a touch recognition mode of the display unit 151, and the like. The rear input unit may be configured to allow touch input, push input, or combinations thereof. 
The rear input unit may be located to overlap the display unit 151 on the front side in a direction of the thickness of the terminal body.
[0025] By way of example, the rear input unit may be located on an upper end portion of the rear side of the terminal body so that a user can easily manipulate it using the index finger when the user grabs the terminal body with one hand. Alternatively, the rear input unit may be positioned at virtually any location on the rear side of the terminal body. Embodiments that include the rear input unit may implement all or part of the functionality of the first handling unit 123a in the rear input unit. As such, in situations where the first handling unit 123a is omitted from the front side, the display unit 151 may have a larger screen. As a further alternative, the mobile terminal 100 may include a finger scan sensor that scans a user's fingerprint. The controller 180 can then use the fingerprint information detected by the finger scan sensor as part of an authentication procedure. The finger scan sensor may also be installed in the display unit 151 or implemented in the user input unit 123. The microphone 122 is shown located at one end of the mobile terminal 100, but other locations are possible. If desired, multiple microphones may be implemented, with such an arrangement permitting the reception of stereo sounds. The interface unit 160 may serve as a path for the mobile terminal 100 to interface with external devices. For example, the interface unit 160 may include one or more of a connection terminal for connection to another device (e.g., an earphone, an external speaker, or the like), a port for near-field communication (for example, an Infrared Data Association (IrDA) port, a Bluetooth port, a wireless LAN port and the like), or a power source terminal for supplying power to the mobile terminal 100. The interface unit 160 may be implemented in the form of a socket for receiving an external card, such as a subscriber identification module (SIM), a user identity module (UIM), or a memory card for storing information.
The second camera 121b is shown located at the rear side of the terminal body and has an image capture direction substantially opposite to that of the first camera unit 121a. If desired, the second camera 121b may alternatively be located at other locations, or may be made movable, so as to have an image capture direction different from that shown.
[0026] The second camera 121b may include a plurality of lenses arranged along at least one line. The plurality of lenses may also be arranged in a matrix configuration. Such cameras may be referred to as an "array camera". When the second camera 121b is implemented as an array camera, images can be captured in various ways using the plurality of lenses, and images of better quality can be obtained. As shown in FIG. 2B, a flash 124 is shown adjacent to the second camera 121b. When an image of a subject is captured with the camera 121b, the flash 124 may illuminate the subject. As shown in FIG. 2A, the second audio output unit 152b may be located on the terminal body. The second audio output unit 152b may implement stereophonic sound functions together with the first audio output unit 152a, and may also be used to implement a speakerphone mode for call communication. At least one antenna for wireless communication may be located on the terminal body. The antenna may be installed in the terminal body or formed by the housing. For example, an antenna that configures a portion of the broadcast receiving module 111 may be retractable into the terminal body. Alternatively, an antenna may be formed using a film attached to an inner surface of the rear shell 103, or a housing that includes a conductive material.
[0027] A power source unit 190 for supplying power to the mobile terminal 100 may include a battery 191, which is mounted in the terminal body or detachably coupled to the outside of the terminal body. The battery 191 can receive power via a power source cable connected to the interface unit 160. The battery 191 can also be recharged wirelessly using a wireless charger. Wireless charging may be implemented by magnetic induction or electromagnetic resonance. The rear shell 103 is shown coupled to the rear housing 102 to mask the battery 191, to prevent separation of the battery 191, and to protect the battery 191 from an external impact or a foreign object. When the battery 191 is detachable from the terminal body, the rear shell 103 may be detachably coupled to the rear housing 102. An accessory to protect the appearance, or to assist or extend the functions, of the mobile terminal 100 may also be provided on the mobile terminal 100. As one example of an accessory, a shell or holster for covering or accommodating at least one surface of the mobile terminal 100 may be provided. The shell or holster can cooperate with the display unit 151 to extend the functions of the mobile terminal 100. Another example of an accessory is a touch pen for assisting or extending a touch input to a touch screen. Simultaneous display and editing of a plurality of preview images taken with at least two cameras: a method according to an embodiment of the present invention proceeds as follows. First, a plurality of preview images taken with at least two cameras are displayed simultaneously. Second, an image/video is taken/made using a plurality of the displayed images, edited or left intact. Such a method is described in detail with reference to Figure 3 as follows. FIG.
3 is a flowchart of an example of a process of taking a photo/making a video by displaying a plurality of preview images taken with at least two cameras in a mobile terminal in accordance with an embodiment of the present invention. Referring to Figure 3, a multi-preview mode may be entered [S310]. In this case, the multi-preview mode may mean a mode for taking/making an image/video using a plurality of cameras available to a mobile terminal. The multi-preview mode may be a mode enabled by default in response to the execution of a camera application. Alternatively, the multi-preview mode may be activated in response to an entry of a prescribed command while a normal photography mode is active. Of course, the multi-preview mode may also exist as an application configured separately from a general camera application. When the multi-preview mode is entered, the controller 180 activates the front camera 121a and the rear camera 121b together, and is then able to simultaneously display preview images taken with the two cameras 121a and 121b on the touch screen [S320]. Optionally, if an additional camera is further included, a preview image taken with the additional camera may also be displayed. In doing so, each of the displayed preview images may include a preview image taken with each of the cameras, a preview image having different preview images arranged to overlap one another, a preview image having a prescribed visual effect applied to a normal preview image, or similar preview images.
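A minimal control-flow sketch of steps S310-S320 follows. The class and method names, and the stand-in camera object, are assumptions made for illustration; the patent does not prescribe this API:

```python
class FakeCamera:
    """Stand-in for a hardware camera (assumed interface)."""
    def __init__(self, name):
        self.name = name
        self.on = False

    def activate(self):
        self.on = True

    def preview(self):
        return ("preview", self.name)

class MultiPreviewController:
    """Sketch of the controller 180 entering the multi-preview mode."""
    def __init__(self, front_camera, rear_camera):
        self.cameras = [front_camera, rear_camera]
        self.previews = []

    def enter_multi_preview(self):
        # S310: enter the mode; activate all available cameras together
        for cam in self.cameras:
            cam.activate()
        # S320: one preview per camera, plus a composite (3rd) preview
        # that overlays the individual previews
        self.previews = [cam.preview() for cam in self.cameras]
        self.previews.append(("composite", tuple(self.previews)))
        return self.previews
```

Steps S330 (editing) and S350 (capture) would then operate on the `previews` list in response to user commands.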
[0028] A plurality of the preview images displayed simultaneously can be edited in response to a user command input [S330]. In doing so, the editing may include enlarging/reducing each preview image, adjusting the number of previews displayed simultaneously, applying a visual effect, changing the arrangement of the displayed previews, changing whether a preview image is updated in real time, and the like. Each editing and configuration process will be described in detail later. Finally, the controller 180 may control the capture/production of an image/video in response to a type of photograph command entered by a user [S350]. Fig. 4 is a diagram of an example of a display configuration of a plurality of preview images in a mobile terminal according to an embodiment of the present invention. Referring to FIG. 4, in a multi-preview mode, a preview image 410 taken with the rear camera 121b, a preview image 420 taken with the front camera 121a, and a composite preview image 430, in which a reduced image 420' of the preview image taken with the front camera is displayed within the preview image taken with the rear camera, can be displayed simultaneously on the touch screen 151. Of course, in the case where another camera is provided in the mobile terminal in addition to the front and rear cameras, more combinations of preview images can be displayed than the preview images shown in FIG. 4. Moreover, even when only the front and rear cameras are provided, the display is not limited to two preview images. In addition, the order of arrangement or the type of the preview images can be changed. To clarify the following description, a preview image taken with the front camera 121a will be named a 1st preview, a preview image taken with the rear camera 121b will be named a 2nd preview, and an image created by arranging the 1st preview and the 2nd preview so that they overlap each other at least in part will be named a 3rd preview.
[0029] If a prescribed preview image is selected, through a touch input, from the state shown in Fig. 4, an image of the corresponding preview may be taken and saved in the memory 170. In addition, if a long touch is applied to a prescribed preview, or a touch is repeatedly applied to a prescribed preview within a prescribed time, a video of the corresponding preview can be made. At the same time, by extending the video-making method and making a single video via multi-touch, the image included in the video can be switched among a plurality of preview images. This is described in detail with reference to Figures 5A-5D as follows.
[0030] FIGS. 5A-5D are diagrams of an example of a video-making process using a plurality of preview images in a mobile terminal according to an embodiment of the present invention. Referring to FIG. 5A, when a 1st preview 520, a 2nd preview 510 and a 3rd preview 530 are displayed simultaneously in a multi-preview mode, if a user touches the 1st preview 520, the controller 180 begins to record an image corresponding to the 1st preview 520 as a video. During recording, referring to FIG. 5B, if the user touches the 2nd preview 510 with a finger for a predetermined time (i.e., a time recognized as a long touch) while maintaining the touch on the 1st preview 520, an image corresponding to the 3rd preview 530 can be recorded, in place of the 1st preview 520, starting from the moment (for example 10 seconds after the beginning of the recording) recognized as the long touch. After that, referring to Fig. 5C, if the touch is released from the 1st preview only, an image corresponding to the 2nd preview may be recorded as the video from the moment (e.g. 20 seconds after the start of recording) at which the corresponding touch is released. In doing so, if the touch is also released from the 2nd preview, the video recording may end. Finally, referring to Figure 5D, the video recorded by the method described above can be played back as follows. First, the image corresponding to the 1st preview is initially displayed. Second, when the playback time reaches 10 seconds, the image corresponding to the 3rd preview begins to be displayed. Finally, the image corresponding to the 2nd preview is displayed from the playback time of 20 seconds until the video ends. Hence, simply by changing the preview on which a touch input is maintained, without a separate camera switching command or separate editing, a user can create a single video file by conveniently selecting/changing the preview image desired to be recorded at any time while checking the images of the various previews in real time. Figs.
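The touch-driven source switching of Figs. 5A-5D can be modeled as a timeline of which previews are held at each moment. The event representation below is an assumption made for illustration, not the patent's implementation:

```python
def recorded_segments(events, end_time):
    """events: chronological (time, held_previews) snapshots, where
    held_previews is the set of previews currently touched. Holding
    two previews at once records the composite (3rd) preview; holding
    one records that preview."""
    segments, prev_time, prev_src = [], None, None
    for t, held in events:
        src = ("composite" if len(held) > 1
               else next(iter(held)) if held else None)
        if prev_src is not None:
            segments.append((prev_time, t, prev_src))
        prev_time, prev_src = t, src
    if prev_src is not None:
        segments.append((prev_time, end_time, prev_src))
    # Drop intervals in which nothing was held (recording paused/ended)
    return [s for s in segments if s[2] is not None]
```

Replaying the scenario of Figs. 5A-5C (touch the 1st preview at 0 s, add the 2nd at 10 s, release the 1st at 20 s, release everything at 30 s) yields the three recorded segments described above: 1st preview, then composite, then 2nd preview.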
6A-6D are diagrams of an example of a method of switching a full screen between previews and changing a preview magnification during a multi-preview mode in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 6A, while a user touches two different points on a preview (e.g., a second preview 610) that the user wishes to view in full screen with a pointer 620 during the activated multi-preview mode, if the user applies a drag that increases the distance between the two touch points, for example, the second preview can be displayed in full screen [Fig. 6B]. In doing so, if a touch of the same pattern as shown in Fig. 6A is applied again, the preview may be displayed by zooming in on the touch points [Fig. 6C]. On the other hand, while two different points on the preview in the zoom-in state are touched with the pointer 620, if a drag is applied by decreasing the distance between the touch points, the preview in the zoom-in state can be displayed by zooming out.
[0031] Meanwhile, when the preview is displayed in full screen, if a drag is applied that decreases the distance between the two touch points while the touch is maintained on the two different points [Fig. 6D], the display can return to the state shown in Fig. 6A.
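The pinch gestures of Figs. 6A-6D form a simple display-state ladder: a pinch-out advances grid view, full screen, then zoomed view, and a pinch-in reverses each step. The following sketch is an illustrative model under that assumption; the state names are not from the patent.

```python
# Display states for a single preview, per Figs. 6A-6D (names assumed).
GRID, FULLSCREEN, ZOOMED = "grid", "fullscreen", "zoomed"

def next_state(state, gesture):
    """Map a pinch gesture to the next display state.

    pinch_out (two touch points dragged apart) advances
    grid -> fullscreen -> zoomed; pinch_in steps back.
    """
    order = [GRID, FULLSCREEN, ZOOMED]
    i = order.index(state)
    if gesture == "pinch_out":
        return order[min(i + 1, len(order) - 1)]
    if gesture == "pinch_in":
        return order[max(i - 1, 0)]
    return state
```

For example, pinching out twice on a preview in the grid reaches the zoomed state, and a pinch-in on the full-screen preview returns to the grid of Fig. 6A.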
[0032] Figs. 7A-7C are diagrams of an example of a display configuration of thumbnails of images taken in the multi-preview mode in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 7A, if a user touches a second preview image 710 with a pointer 720, an image corresponding to the second preview image 710, as displayed at the time of the touch input, can be taken and stored in the memory 170. At the same time, a portion 711' of a thumbnail 711 of the captured image may be displayed below the second preview image 710. In doing so, if a touch input with the pointer is applied again to the second preview image 710, with reference to Fig. 7B, a portion 712' of a thumbnail 712 of the newly taken image is created below the second preview image 710, and the thumbnail 711 of the previously taken image is pushed down. In the case where the user intends to check the thumbnails of the taken images, with reference to Fig. 7C, if the user drags a portion below the second preview 710 (or a portion of a displayed thumbnail) upwards, the second preview 710 and the thumbnails 711 and 712 are scrolled upwards so that an additional portion 712" of the thumbnail 712 of the newly taken image can be displayed progressively, and the thumbnail 711 of the initially taken image may be fully displayed, in proportion to the length of the sliding touch input. If a thumbnail image is touched, a gallery application is executed to display the corresponding image in full screen. In the description with reference to Figs. 7A-7C, a configuration or type of display of thumbnails of the taken images is described focusing on the second preview as an example. It goes without saying that a similar thumbnail display method is applicable to the first or the third preview.
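The thumbnail behavior of Figs. 7A-7C can be modeled as a scrollable stack below the preview: each capture pushes its thumbnail onto the top, and upward drags reveal more of the stack. This is an illustrative sketch with assumed names, not the patent's implementation.

```python
class ThumbnailStrip:
    """Illustrative model of the thumbnail stack of Figs. 7A-7C.

    Each capture inserts its thumbnail directly below the preview,
    pushing earlier thumbnails further down; dragging upward scrolls
    more of the stack into view.
    """

    def __init__(self, visible_rows=1):
        self.thumbs = []            # newest first, as in Fig. 7B
        self.scroll = 0             # extra rows revealed by upward drags
        self.visible_rows = visible_rows

    def capture(self, image_id):
        # New thumbnail appears on top; older ones are pushed down.
        self.thumbs.insert(0, image_id)

    def drag_up(self, rows=1):
        limit = max(len(self.thumbs) - self.visible_rows, 0)
        self.scroll = min(self.scroll + rows, limit)

    def visible(self):
        return self.thumbs[: self.visible_rows + self.scroll]
```

After two captures, only the newest thumbnail (712) is partly visible; a drag upward also brings the earlier thumbnail (711) into view, mirroring Fig. 7C.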
Meanwhile, according to an embodiment of the present invention, a preview image may be displayed so that a visual effect (e.g., a filter effect, an HDR effect, a color inversion effect, a sepia tone effect, a black-and-white effect, etc.) or a photography configuration value (e.g., exposure compensation, aperture value, use or non-use of a flash, white balance, etc.) is applied to the corresponding preview image. And, the type of such an effect or configuration value can be conveniently changed. This is described in detail with reference to Figs. 8A-8D as follows. Figs. 8A-8D are diagrams of an example of a method of changing a visual effect applied to a preview image in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 8A, in the multi-preview mode, a user may drag a preview (e.g., a second preview image 810), which the user wishes to check, downward. Then, as the second preview image 810 moves downward, indicators 820 indicating configuration values can be displayed above the second preview image 810. In doing so, if the user drags the corresponding preview image downward again, the user is able to check the result of applying various pre-established configuration values to the corresponding preview.
[0033] For example, if the first preview image 830 is dragged down twice, with reference to Fig. 8B, a plurality of preview images 831 and 833 to which different configuration values are applied can be displayed. In this case, the preview image 831 on the far left side may be an image to which the same configuration values as those of the preview image 830 displayed in the multi-preview mode are applied. And, indicators 841 and 842 corresponding to the configuration values applied to the corresponding preview images can be displayed next to the corresponding preview images, respectively. In doing so, if the user drags a preview image 831 onto another preview image 832, the configuration values applied to the two preview images can both be applied to the preview image 831. Then, referring to Fig. 8C, the configuration value indicators 841 and 842 applied to the two previous preview images are displayed together as configuration value indicators 841' of the corresponding preview image 831. When the configuration values are applied together, if a collision occurs, the image 831 dragged in the situation shown in Fig. 8B can return to its previous location with a visual rebound effect. Referring to Fig. 8D, if a specific preview image 831' is displayed in full screen, each time a flick touch is input in the width direction, a plurality of predetermined configuration values are changed in turn so as to be applied to the corresponding preview image.
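The drag-to-combine behavior of Figs. 8B-8C amounts to merging two sets of configuration values, with the "collision" case arising when the two previews set the same parameter to different values. The sketch below is an illustrative assumption about how such a merge might be modeled; the key names are invented for the example.

```python
def merge_settings(target, dragged):
    """Combine the configuration values of two previews (Figs. 8B-8C).

    Values are keyed by setting name; if both previews set the same key
    to different values, the merge 'collides' and is rejected, so the
    caller can show the visual rebound effect instead.
    """
    conflicts = {k for k in target if k in dragged and target[k] != dragged[k]}
    if conflicts:
        return None, conflicts        # collision: dragged preview bounces back
    merged = dict(target)
    merged.update(dragged)
    return merged, set()
```

Merging a sepia-effect preview with an exposure-compensated one succeeds and yields both indicators on the target, while merging two previews that each set a different filter effect collides and leaves the target unchanged.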
[0034] The aforementioned preview images to which different configuration values are applied can be displayed together in the multi-preview mode. This is described in detail with reference to Figs. 9A-9B as follows. Figs. 9A-9B are diagrams of an example of a configuration for simultaneously displaying a plurality of preview images to which different configuration values are applied in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 9A, in addition to the three basic preview images 911, 921 and 931 in the multi-preview mode, sub-preview images 912, 922 and 932, to which configuration values different from those of the basic preview images are applied, can be displayed simultaneously. In this case, the preview images 911, 921 and 931 correspond to the sub-preview images 912, 922 and 932, respectively; that is, each preview image is paired with its own sub-preview image. Referring to Fig. 9B, a plurality of sub-preview images to which different configuration values are applied can further be displayed for each of the first to third preview images. In particular, the sub-preview images may be displayed in different sizes depending on their respective frequencies of use. For example, a plurality of sub-preview images, to which different configuration values for the preview image 911 are applied, may be displayed below the preview image 911. In doing so, the sub-preview image 913, to which the most frequently used configuration value is applied, may be displayed in a size larger than that of the sub-preview image 914, whose configuration value is used relatively less frequently than that of the sub-preview image 913.
[0035] In addition, when there are more sub-preview images 915 than can be viewed on a single screen, if a user slides the corresponding preview column upwards to scroll through the preview images, the sub-preview images 915 not currently displayed can be displayed. Of course, as mentioned in the foregoing description with reference to Figs. 7A-7C, if a thumbnail of a captured image is set to be displayed below a preview image, the thumbnail of the taken image can be displayed below the corresponding sub-preview image. Meanwhile, the preview images other than the basic preview images shown in Figs. 9A-9B may each be a real-time preview image, or a thumbnail image created using the initial preview image at the time the multi-preview mode was entered. If such a preview image is a thumbnail image, it can be updated at prescribed intervals.
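The frequency-based sizing of Fig. 9B can be sketched as ranking the configuration values by usage count and assigning larger tiles to higher ranks. The sizing rule below (a base size plus a step per rank) is an illustrative assumption; the patent only specifies that more frequently used values get larger sub-previews.

```python
def sub_preview_layout(usage_counts, base=48, step=16):
    """Order sub-previews by usage frequency (Fig. 9B) and assign
    larger tile sizes to more frequently used configuration values.

    usage_counts maps a configuration-value name to its use count.
    Returns (name, tile_size) pairs, most frequently used first.
    """
    ranked = sorted(usage_counts, key=usage_counts.get, reverse=True)
    n = len(ranked)
    return [(name, base + step * (n - 1 - i)) for i, name in enumerate(ranked)]
```

A usage history where the sepia value was used most often thus places the sepia sub-preview first and largest, as in the relationship between sub-previews 913 and 914.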
[0036] According to an embodiment of the present invention, the following method is provided. First, preview/sub-preview images and/or contents that are not preview images are freely arranged. Second, an image/video is taken/made according to the arrangement. For clarity, a method of performing such a function will be referred to as a "collage mode". The collage mode can be entered via a command input such as a prescribed menu operation or the like in the multi-preview mode. Such a method is described in detail with reference to Figs. 10A-12C as follows. Figs. 10A-10D are diagrams of an example of a method of freely arranging a preview image in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 10A, when the collage mode is entered, first to third preview images 1011 to 1013 are displayed on a left region 1010, and a resolution setting menu 1021 for the image or video to be created in the collage mode is displayed on a right region 1020. In particular, the menu 1021 may be displayed automatically when the collage mode is entered. Alternatively, in the case where a default resolution value is determined in advance, the menu 1021 may be displayed only if it is called up by a prescribed command input. Referring to Fig. 10B, if the first preview image 1011 is dragged to the right region 1020, the first preview image is copied and then arranged as a preview image 1011' at the end point of the drag in the right region 1020. Once at least one preview image is arranged on the previously empty right region, a photograph button 1030 can be displayed. If a user applies a short touch to the photograph button 1030, the entire right region is saved as a single image. If the user applies a long touch to the photograph button 1030, the entire right region can begin to be saved as a video. If an edge of the preview image 1011' arranged on the right region 1020 is dragged, with reference to Fig. 10C, the preview image 1011' can be changed to a preview image 1011" differing in size from the preview image 1011'. By repeatedly performing the steps shown in Figs. 10B and 10C on the preview images desired by the user, as many times as desired, with reference to Fig. 10D, the right region can be configured as the user wishes. In this way, the user can turn all the preview images arranged in the collage mode into a single image or video file, instead of photographing and then editing a plurality of images. Moreover, instead of calling up the collage mode separately, the multi-preview mode can be switched naturally to the collage mode. This is described with reference to Figs. 11A-11B as follows.
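The collage canvas of Figs. 10A-10D can be sketched as a list of placed items plus a shutter that emits the whole arrangement as one image or one video depending on the touch type. This is an illustrative model with assumed names and default sizes, not the patent's implementation.

```python
class Collage:
    """Illustrative sketch of the collage-mode canvas of Figs. 10A-10D.

    Previews dragged onto the right region are copied into the canvas
    at the drop point; dragging an edge resizes the copy; a short touch
    on the photograph button saves the whole region as one image, a
    long touch records it as one video.
    """

    def __init__(self, resolution=(1920, 1080)):
        self.resolution = resolution    # from the resolution setting menu 1021
        self.items = []                 # each item: [source_preview, x, y, w, h]

    def drop(self, preview, x, y, w=320, h=240):
        self.items.append([preview, x, y, w, h])

    def resize(self, index, w, h):
        self.items[index][3:5] = [w, h]

    def shutter(self, touch):
        # One file is produced from the entire arrangement.
        return ("image" if touch == "short" else "video", list(self.items))
```

Dropping the first preview, resizing it, and then applying a short or long touch to the photograph button yields a single image or video of the whole right region, respectively.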
[0037] Figs. 11A-11B are diagrams of another example of a method of freely arranging a preview image in a mobile terminal according to an embodiment of the present invention.
[0038] Referring to Fig. 11A, when first to third preview images 1110, 1120 and 1130 are displayed in the multi-preview mode, if a user drags the preview image 1110, which is to be arranged in the collage mode, to a right-side edge, the screen is scrolled in the right direction so that a region corresponding to the right region 1020 shown in Fig. 10A can be displayed, as shown in Fig. 11B. Figs. 12A-12C are diagrams of an example of a method of additionally arranging an image content together with a preview image in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 12A, a content add button 1240 may further be displayed in the multi-preview mode in addition to the first to third preview images 1210, 1220 and 1230. If a user selects the content add button 1240, with reference to Fig. 12B, a content browser 1250 may be displayed. The user can select an image/video saved in the memory of the mobile terminal, in an SNS, or on a web page via the content browser. Referring to Fig. 12C, the selected content 1240' may be displayed at the location of the content add button 1240. In this case, the user may arrange a separate content, which is not a preview image taken with a camera, by dragging the selected content 1240' to the prescribed edge. In this way, the user can save a previously saved video and an image taken with a camera together as a single video in the collage mode. Figs. 13A-13C are diagrams of an example of a process of creating a third preview image using a first preview image and a second preview image in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 13A, when only a preview image 1310 and a preview image 1320 are displayed in the multi-preview mode, if the preview image 1310 is dragged to the preview image 1320, a third preview image can be displayed using the preview image 1310 as the background [Fig. 13B].
On the other hand, if the preview image 1320 is dragged to the preview image 1310, a third preview image may be displayed using the preview image 1320 as the background [Fig. 13C]. In the situation shown in Fig. 13B or Fig. 13C, after two different points of the third preview image have been touched, if a drag is applied so that the two touched points come closer to each other, the display can return to the state shown in Fig. 13A. Figs. 14A-14B are diagrams of an example of a process of activating the multi-preview mode in consideration of battery saving in a mobile terminal according to an embodiment of the present invention. Referring to Fig. 14A, a first preview image 1420, a second preview image 1410 and a third preview image 1430 may be displayed in the multi-preview mode. And, each of the preview images can be updated in real time. In doing so, if a user touches the second preview image 1410, with reference to Fig. 14B, an image corresponding to the second preview image 1410 is taken and a thumbnail 1411 of the taken image can be displayed below the second preview image 1410. If a photograph of a specific preview image is taken, the remaining preview images are fixed as the images at the time of photography and are no longer updated. And, an icon 1440 indicating such a state can be displayed on each of the remaining preview images. If the user touches a prescribed one of the remaining preview images again, the icon 1440 displayed on the touched preview image disappears and the corresponding preview image may begin to be updated in real time again. In this way, the power consumed in simultaneously processing a plurality of preview images in the mobile terminal can be saved.
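The power-saving behavior of Figs. 14A-14B can be sketched as a per-preview live/frozen flag: a capture freezes every other preview (showing the indicator icon), and touching a frozen preview resumes its updates. The sketch below is illustrative; the names are assumptions.

```python
class PreviewUpdater:
    """Illustrative model of the battery-saving logic of Figs. 14A-14B.

    After a still is taken from one preview, the remaining previews are
    frozen (no longer updated in real time) and flagged with an icon;
    touching a frozen preview resumes its real-time updates.
    """

    def __init__(self, previews):
        self.live = {p: True for p in previews}

    def capture(self, preview):
        # Freeze every preview except the one that was photographed.
        for p in self.live:
            if p != preview:
                self.live[p] = False    # frozen: indicator icon 1440 shown

    def touch(self, preview):
        if not self.live[preview]:
            self.live[preview] = True   # icon disappears, updates resume

    def frozen(self):
        return sorted(p for p, on in self.live.items() if not on)
```

After capturing from the second preview, the first and third previews are frozen; touching the third preview thaws it while the first stays fixed.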
According to the embodiments mentioned in the foregoing description, a front preview, a rear preview and a dual preview are displayed simultaneously on the touch screen of a mobile terminal, whereby a user can make a selection by looking at a screen provided in advance, before selecting a specific camera mode. Unlike an existing digital camera or smartphone capable of providing only a single preview, the present invention allows a user to conveniently select a prescribed camera by viewing the provided front, rear and dual previews in real time, thereby improving the usability of the camera. In addition, while a plurality of preview images are simultaneously displayed, the present invention can take a snapshot of a desired preview instantly, thereby eliminating a task such as camera switching or the like and allowing fast photography. Furthermore, the present invention also provides previews to which visual effects of various types are applied, in addition to the front preview, rear preview and dual preview, thereby facilitating the prediction of a photography result. It will be apparent to those skilled in the art that the present invention can be specified in other forms without departing from the spirit or scope of the invention. According to an embodiment of the present invention, the methods described above can be implemented on a program recording medium as computer-readable code. Computer-readable media include all types of recording devices in which computer-readable data are saved. Computer-readable media include ROM, RAM, CD-ROMs, magnetic tapes, floppy disks, optical data storage devices and the like, for example, and also include carrier-wave type implementations (e.g., transmission over the Internet). And, the computer may include the controller 180 of the terminal. The aforementioned embodiments are achieved by combining structural elements and features of the present invention in predetermined forms.
Each of the structural elements or features should be considered optional unless specified otherwise. Each of the structural elements or features can be realized without being combined with other structural elements or features. Likewise, structural elements and/or features may be combined with each other to constitute the embodiments of the present invention. It will be apparent to those skilled in the art that various modifications and variations can be made in the present invention without departing from the spirit or scope of the invention. Thus, the present invention is intended to cover the modifications and variations of this invention provided that they fall within the scope of the appended claims and their equivalents. Of course, the invention is not limited to the embodiments described and shown above, from which other modes and other embodiments can be provided without departing from the scope of the invention.
Claims (20)
[0001]
CLAIMS 1. A mobile terminal (100) comprising: a first camera configured to obtain an image; a second camera configured to obtain an image; a touch screen configured to display information; and a controller (180) configured to: cause the touch screen to simultaneously display a first preview image generated via the first camera and a second preview image generated via the second camera; and record an image or video corresponding to at least one of the displayed first preview image or the displayed second preview image in response to an input.
[0002]
The mobile terminal (100) of claim 1, wherein: the controller (180) is further configured to cause the touch screen to display a third preview image in addition to the first preview image and the second preview image; and the third preview image includes the first preview image and the second preview image overlapping each other at least partially.
[0003]
The mobile terminal (100) of claim 2, wherein the controller (180) is further configured to: record an image corresponding to the first, second or third preview image in response to a first type of touch input applied to the first, second or third preview image; and record a video corresponding to the first, second or third preview image in response to a second type of touch input applied to the first, second or third preview image.
[0004]
The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to record a first video corresponding to the first or second preview image in response to a first touch input applied to and maintained at the first or second preview image for at least a predetermined period of time.
[0005]
The mobile terminal (100) of claim 4, wherein the controller (180) is further configured to: record a second video including both a first image corresponding to the first preview image and a second image corresponding to the second preview image in response to a second touch input applied to the other of the first or second preview image during the recording of the first video, such that the second video is recorded while the first and second touch inputs are simultaneously maintained, the first image and the second image being arranged to overlap each other at least partially in the second video; record a third video corresponding to the other of the first or second preview image when the first touch input is released and the second touch input is maintained while the second video is recorded; and stop recording the third video when the second touch input is released while the third video is recorded.
[0006]
The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to apply configuration values of both the first and second preview images to the first preview image in response to sliding of the first preview image to the second preview image.
[0007]
The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to cause the touch screen to: display the first or second preview image in a full screen size in response to a first touch input applied to the first or second preview image; and redisplay both the first and second preview images in response to a second touch input applied to the image displayed in the full screen size.
[0008]
The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to cause the touch screen to display a photograph button in a second region of the touch screen in response to sliding of at least the first or second preview image, displayed in a first region of the touch screen, to the second region.
[0009]
The mobile terminal (100) of claim 8, wherein the controller (180) is further configured to: record an image corresponding to the second region in response to a first type of touch input applied to the photograph button while the second region includes the slid first or second preview image; and record a video corresponding to the second region in response to a second type of touch input applied to the photograph button while the second region includes the slid first or second preview image.
[0010]
The mobile terminal (100) of claim 1, wherein the controller (180) is further configured to cause the touch screen to display the first or second preview image in response to sliding of the first or second preview image to an edge of the touch screen, so that only the slid first or second preview image is displayed.
[0011]
A method of controlling a mobile terminal (100), the method comprising: simultaneously displaying, on a touch screen, a first preview image generated via a first camera and a second preview image generated via a second camera; and recording an image or video corresponding to at least one of the displayed first preview image or the displayed second preview image in response to an input.
[0012]
The method of claim 11, further comprising displaying a third preview image in addition to the first preview image and the second preview image, wherein the third preview image includes the first preview image and the second preview image overlapping each other at least partially.
[0013]
The method of claim 12, wherein the recording of the image or video comprises: recording an image corresponding to the first, second or third preview image in response to a first type of touch input applied to the first, second or third preview image; and recording a video corresponding to the first, second or third preview image in response to a second type of touch input applied to the first, second or third preview image.
[0014]
The method of claim 11, wherein the recording of the image or video comprises: recording a first video corresponding to the first or second preview image in response to a first touch input applied to and maintained at the first or second preview image for at least a predetermined period of time.
[0015]
The method of claim 14, further comprising: recording a second video including both a first image corresponding to the first preview image and a second image corresponding to the second preview image in response to a second touch input applied to the other of the first or second preview image during recording of the first video, such that the second video is recorded while the first and second touch inputs are held simultaneously, the first image and the second image being arranged to overlap each other at least partially in the second video; recording a third video corresponding to the other of the first or second preview image when the first touch input is released and the second touch input is maintained while the second video is recorded; and stopping the recording of the third video when the second touch input is released while the third video is recorded.
[0016]
The method of claim 11, further comprising: applying configuration values of both the first and second preview images to the first preview image in response to sliding of the first preview image to the second preview image.
[0017]
The method of claim 11, further comprising: displaying the first or second preview image in a full screen size in response to a first touch input applied to the first or second preview image; and redisplaying both the first and second preview images in response to a second touch input applied to the first or second preview image displayed in the full screen size.
[0018]
The method of claim 11, further comprising: displaying a photograph button in a second region of the touch screen in response to sliding of at least the first or second preview image, displayed in a first region of the touch screen, to the second region.
[0019]
The method of claim 18, further comprising: recording an image corresponding to the second region in response to a first type of touch input applied to the photograph button while the second region includes the slid first or second preview image; and recording a video corresponding to the second region in response to a second type of touch input applied to the photograph button while the second region includes the slid first or second preview image.
[0020]
The method of claim 11, further comprising: displaying the first or second preview image in response to sliding of the first or second preview image to an edge of the touch screen so that only the slid first or second preview image is displayed.
Similar technologies:
Publication number | Publication date | Patent title
FR3024786A1|2016-02-12|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
FR3021133B1|2019-08-30|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3031601B1|2019-08-30|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3026201A1|2016-03-25|
FR3021424B1|2019-09-20|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3021766A1|2015-12-04|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE MOBILE TERMINAL
FR3025328B1|2019-07-12|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021136A1|2015-11-20|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3022367A1|2015-12-18|
FR3021767A1|2015-12-04|MOBILE TERMINAL AND METHOD FOR CONTROLLING THE SAME
FR3021135A1|2015-11-20|
FR3021134A1|2015-11-20|MOBILE TERMINAL
FR3021485A1|2015-11-27|MOBILE DEVICE AND METHOD OF CONTROLLING THE SAME
FR3039673A1|2017-02-03|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3019665A1|2015-10-09|
FR3022649A1|2015-12-25|
FR3043478A1|2017-05-12|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
FR3021425A1|2015-11-27|
US20160054567A1|2016-02-25|Mobile terminal, glasses-type terminal, and mutual interworking method using screens thereof
FR3046470B1|2019-11-08|MOBILE TERMINAL
FR3039674A1|2017-02-03|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
US20180107360A1|2018-04-19|Mobile terminal and method for controlling the same
FR3022648A1|2015-12-25|
FR3041785A1|2017-03-31|MOBILE TERMINAL AND METHOD OF CONTROLLING THE SAME
US9927967B2|2018-03-27|Mobile terminal and method for controlling the same
Patent family:
Publication number | Publication date
CN105376396B|2019-12-03|
FR3024786B1|2018-08-31|
KR20160018001A|2016-02-17|
EP2983355A1|2016-02-10|
US20160044235A1|2016-02-11|
CN105376396A|2016-03-02|
US9729781B2|2017-08-08|
KR102216246B1|2021-02-17|
Cited documents:
Publication number | Filing date | Publication date | Applicant | Patent title
US20070279482A1|2006-05-31|2007-12-06|Motorola Inc|Methods and devices for simultaneous dual camera video telephony|
EP2448278A2|2010-11-01|2012-05-02|Lg Electronics Inc.|Mobile terminal and method of controlling an image photographing therein|
KR101328950B1|2007-04-24|2013-11-13|엘지전자 주식회사|Image display method and image communication terminal capable of implementing the same|
EP2393000B1|2010-06-04|2019-08-07|Lg Electronics Inc.|Mobile terminal capable of providing multiplayer game and method of controlling operation of the mobile terminal|
KR101674959B1|2010-11-02|2016-11-10|엘지전자 주식회사|Mobile terminal and Method for controlling photographing image thereof|
CN103945045A|2013-01-21|2014-07-23|联想有限公司|Method and device for data processing|
KR20140098009A|2013-01-30|2014-08-07|삼성전자주식회사|Method and system for creating a context based camera collage|
KR102076771B1|2013-02-21|2020-02-12|삼성전자주식회사|Image capturing using multiple screen sections|
US9318149B2|2013-03-01|2016-04-19|Gvbb Holdings S.A.R.L.|Method and system of composite broadcast control|KR102032541B1|2013-02-26|2019-11-08|삼성전자주식회사|Apparatus and method for processing a image in device|
US10078444B2|2013-06-25|2018-09-18|Lg Electronics Inc.|Mobile terminal and method for controlling mobile terminal|
JP5987931B2|2015-02-09|2016-09-07|株式会社リコー|Video display system, information processing apparatus, video display method, video display program, video processing apparatus, video processing method, and video processing program|
US20160366323A1|2015-06-15|2016-12-15|Mediatek Inc.|Methods and systems for providing virtual lighting|
JP6643008B2|2015-08-26|2020-02-12|キヤノン株式会社|Image processing apparatus, control method thereof, control program, and storage medium|
CN105549879A|2015-12-08|2016-05-04|联想有限公司|Information processing method and electronic equipment|
KR20170110837A|2016-03-24|2017-10-12|주식회사 하이딥|Mobile terminal capable of easily converting photographing modes and photographing mode converting method|
CN105979156A|2016-06-30|2016-09-28|维沃移动通信有限公司|Panoramically photographing method and mobile terminal|
DE112016007363T5|2016-10-20|2019-07-11|Symbol Technologies, Llc|Mobile device with edge activation|
CN106572306A|2016-10-28|2017-04-19|北京小米移动软件有限公司|Image shooting method and electronic equipment|
JP6808480B2|2016-12-27|2021-01-06|キヤノン株式会社|Imaging control device and its control method|
JP6765956B2|2016-12-27|2020-10-07|キヤノン株式会社|Imaging control device and its control method|
JP6833507B2|2016-12-27|2021-02-24|キヤノン株式会社|Imaging control device and its control method|
JP6833505B2|2016-12-27|2021-02-24|キヤノン株式会社|Imaging control device and its control method|
CN106713716B|2017-01-23|2020-03-31|努比亚技术有限公司|Shooting control method and device for double cameras|
CN107040723B|2017-04-28|2020-09-01|努比亚技术有限公司|Imaging method based on double cameras, mobile terminal and storage medium|
US10542245B2|2017-05-24|2020-01-21|Lg Electronics Inc.|Mobile terminal and method for controlling the same|
CN107395969B|2017-07-26|2019-12-03|维沃移动通信有限公司|A kind of image pickup method and mobile terminal|
CN108804628B|2018-05-31|2021-05-07|维沃移动通信有限公司|Picture display method and terminal|
CN110035230A|2019-04-15|2019-07-19|珠海格力电器股份有限公司|A kind of picture display control method, system and intelligent terminal based on Folding screen|
CN110971823B|2019-11-29|2021-06-29|维沃移动通信(杭州)有限公司|Parameter adjusting method and terminal equipment|
Legal status:
2016-05-30| PLFP| Fee payment|Year of fee payment: 2 |
2017-05-30| PLFP| Fee payment|Year of fee payment: 3 |
2018-01-19| PLSC| Publication of the preliminary search report|Effective date: 20180119 |
2018-05-29| PLFP| Fee payment|Year of fee payment: 4 |
2019-04-10| PLFP| Fee payment|Year of fee payment: 5 |
2020-04-08| PLFP| Fee payment|Year of fee payment: 6 |
2022-02-11| ST| Notification of lapse|Effective date: 20220105 |
Priority:
Application number | Filing date | Patent title
KR20140101834|2014-08-07|
KR1020140101834A|KR102216246B1|2014-08-07|2014-08-07|Mobile terminal and method for controlling the same|